Dimensionality Reduction and Learning: Ridge Regression vs. PCA

Author

  • Sham Kakade
Abstract

1 Intro

The theme of these two lectures is that for L2 methods we need not work in infinite-dimensional spaces. In particular, we can non-adaptively find and work in a low-dimensional space and achieve about as good results. These results question the need to work explicitly in infinite- (or high-) dimensional spaces for L2 methods. In contrast, for sparsity-based methods (including L1 regularization), such non-adaptive projection methods significantly lose predictive power.
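The claim above, that an L2 method such as ridge regression can be run in a non-adaptively chosen low-dimensional space with little loss, can be sketched as follows. Everything here (the data distribution, the dimensions, and the Gaussian projection) is a hypothetical illustration, not the lecture's construction; the signal is placed in a few high-variance coordinates so that a small-norm L2 predictor does well.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 1000, 50            # samples, ambient dimension, projected dimension

# Hypothetical data whose variance (and signal) concentrates in the first
# 10 coordinates, so an L2 predictor with small norm predicts well.
scales = np.concatenate([np.ones(10), 0.05 * np.ones(d - 10)])
w_true = np.zeros(d)
w_true[:10] = 1.0
X = rng.standard_normal((n, d)) * scales
y = X @ w_true + 0.1 * rng.standard_normal(n)

def ridge_fit(A, b, lam):
    """Closed-form ridge solution: (A'A + lam*I)^{-1} A'b."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

# Non-adaptive projection: a random Gaussian map chosen without looking at the data.
P = rng.standard_normal((d, k)) / np.sqrt(k)
Xp = X @ P

mse_full = np.mean((X @ ridge_fit(X, y, 1.0) - y) ** 2)
mse_proj = np.mean((Xp @ ridge_fit(Xp, y, 1.0) - y) ** 2)
print(mse_full, mse_proj, np.var(y))
```

On data of this kind, ridge fitted in the 50-dimensional projected space recovers most of the fit available in the 1000-dimensional ambient space; a sparsity-based method projected the same way would have no sparse structure left to exploit.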


Similar articles

A Comparative Study of Locality Preserving Projection and Principle Component Analysis on Classification Performance Using Logistic Regression

There are a variety of classification techniques, such as neural networks, decision trees, support vector machines, and logistic regression. The problem of dimensionality is pertinent to many learning algorithms: it denotes a drastic rise in computational complexity, and hence we need to use dimensionality reduction methods. These methods include principal component analysis (PCA) and locality...
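The PCA step this abstract refers to can be sketched in a few lines (a generic illustration with synthetic data, not the paper's experiment): compute principal directions via an SVD of the centered data, then keep the top-k projections as features for a downstream classifier.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 20))   # hypothetical data: 100 samples, 20 features

# PCA via SVD of the centered data matrix; rows of Vt are principal directions,
# ordered by decreasing singular value.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 5
Z = Xc @ Vt[:k].T                    # reduced features for a downstream classifier
print(Z.shape)                       # (100, 5)
```

The reduced matrix `Z` would then replace `X` as input to logistic regression (or any of the classifiers listed above), trading some variance for a much smaller feature space.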


Image Super-resolution via Feature-augmented Random Forest

Recent random-forest (RF)-based image super-resolution approaches inherit some properties from dictionary-learning-based algorithms, but the effectiveness of these properties in RF has been overlooked in the literature. In this paper, we present a novel feature-augmented random forest (FARF) for image super-resolution, where the conventional gradient-based features are augmented with gradient magnitude...


Coupling Random Orthonormal Projection with Gaussian Generative Model for Non-Interactive Private Data Release

A key challenge facing the design of differentially private systems in the non-interactive setting is to maintain the utility of the released data for general analytics applications. To overcome this challenge, we propose the PCA-Gauss system, which leverages a novel combination of dimensionality reduction and a generative model for synthesizing differentially private data. We present multiple al...


Feature dimensionality reduction for example-based image super-resolution

Support vector regression has been proposed for a number of image processing tasks, including blind image deconvolution, image denoising, and single-frame super-resolution. As with other machine learning methods, the training is slow. In this paper, we attempt to address this issue by reducing the feature dimensionality through principal component analysis (PCA). Our single-frame super-resolution ...


Regularized Discriminant Analysis, Ridge Regression and Beyond

Fisher linear discriminant analysis (FDA) and its kernel extension, kernel discriminant analysis (KDA), are well-known methods that consider dimensionality reduction and classification jointly. While they are widely deployed in practical problems, there are still unresolved issues surrounding their efficient implementation and their relationship with least-mean-squares procedures. In this paper we address...



Journal:

Volume   Issue

Pages  -

Publication date: 2010